update batch size in LightningModule.datamodule when auto scaling batch size #3266
Conversation
Codecov Report
@@           Coverage Diff            @@
##           master    #3266    +/-  ##
=======================================
- Coverage      90%      86%     -4%
=======================================
  Files          90       91      +1
  Lines        8158     8700    +542
=======================================
+ Hits         7362     7499    +137
- Misses        796     1201    +405
This pull request is now in conflict... :(
if trainer.datamodule is not None and hasattr(trainer.datamodule, batch_arg_name):
    setattr(trainer.datamodule, batch_arg_name, new_size)
Is this necessary? Shouldn't lightning_setattr take care of this?
you mean accessing it through model.trainer.datamodule? Ok, I'll try that
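For context, here is a minimal standalone sketch of what writing the scaled batch size back to the datamodule looks like. This is not the actual Lightning implementation; the `Model`, `Trainer`, `DataModule`, and `apply_new_batch_size` names are illustrative stand-ins for the real classes and helper.

```python
# Sketch (NOT the actual Lightning code): after the batch size finder picks
# a new value, write it back both to the model and to the datamodule attached
# to the trainer, so DataLoaders built by the datamodule pick it up too.

class DataModule:
    def __init__(self, batch_size=32):
        self.batch_size = batch_size

class Trainer:
    def __init__(self, datamodule=None):
        self.datamodule = datamodule

class Model:
    def __init__(self, batch_size=32):
        self.batch_size = batch_size
        self.trainer = None  # attached by the trainer

def apply_new_batch_size(model, batch_arg_name, new_size):
    # update the attribute on the model itself
    if hasattr(model, batch_arg_name):
        setattr(model, batch_arg_name, new_size)
    # also update the datamodule reachable through model.trainer, if any
    trainer = getattr(model, "trainer", None)
    datamodule = getattr(trainer, "datamodule", None) if trainer else None
    if datamodule is not None and hasattr(datamodule, batch_arg_name):
        setattr(datamodule, batch_arg_name, new_size)

dm = DataModule()
m = Model()
m.trainer = Trainer(datamodule=dm)
apply_new_batch_size(m, "batch_size", 64)
print(m.batch_size, dm.batch_size)  # 64 64
```

The key point of the fix is the second half: without it, only the model's attribute changes and a datamodule keeps building loaders with the stale batch size.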
@@ -186,15 +186,17 @@ def lightning_hasattr(model, attribute):
            attr = attribute in model.hparams
        else:
            attr = hasattr(model.hparams, attribute)
    elif hasattr(model.datamodule, attribute):
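To make the lookup order concrete, here is an illustrative standalone sketch of a `lightning_hasattr`-style helper (not the actual Lightning utility): check the model itself first, then `model.hparams` (which may be a dict or a namespace), then fall back to `model.datamodule`.

```python
# Illustrative sketch of the attribute lookup order; name and exact
# semantics are assumptions, not the real pytorch_lightning utility.

def lightning_hasattr_sketch(model, attribute):
    # 1. plain attribute on the model
    if hasattr(model, attribute):
        return True
    # 2. key/attribute inside model.hparams (dict or namespace)
    hparams = getattr(model, "hparams", None)
    if hparams is not None:
        if isinstance(hparams, dict):
            if attribute in hparams:
                return True
        elif hasattr(hparams, attribute):
            return True
    # 3. fall back to the attached datamodule, if any
    datamodule = getattr(model, "datamodule", None)
    return datamodule is not None and hasattr(datamodule, attribute)


class DataModule:
    batch_size = 16

class Model:
    datamodule = DataModule()

print(lightning_hasattr_sketch(Model(), "batch_size"))  # True
```

The `elif` branch in the diff above corresponds to step 3: the attribute lives only on the datamodule, not on the model or its hparams.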
this is a bit confusing next to the line above... mind adding a comment explaining what this case is about?
Nice!
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
LGTM
What does this PR do?
Fixes #3233
The new test fails on master, demonstrating the bug.
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃